
# Text pre-training

## Hebrew Gemma 11B V2
Author: yam-peleg
Hebrew-Gemma-11B-V2 is an open-source Hebrew/English pre-trained generative text large language model with 11 billion parameters, based on Google's Gemma-7B architecture.
Tags: Large Language Model · Transformers · Supports Multiple Languages · Other
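Since the card lists the Transformers library, a minimal text-generation sketch is shown below. It assumes the model is published on Hugging Face under the repo id `yam-peleg/Hebrew-Gemma-11B-V2` (combining the author and model name above); adjust the id if the hosted name differs, and note that an 11B-parameter model needs a GPU with substantial memory.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo id, built from the author and model name on this card.
model_id = "yam-peleg/Hebrew-Gemma-11B-V2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the `accelerate` package; it spreads the 11B weights
# across available GPUs (or offloads to CPU) automatically.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "שלום, מה שלומך?"  # Hebrew: "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```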
## Distilbert Portuguese Cased
Author: adalbertojunior
This is a distilled Portuguese BERT model derived from BERTimbau, retaining 99% of the original model's accuracy.
Tags: Large Language Model · Transformers · Other
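As a BERT-style encoder, this model is typically used for masked-token prediction or as a fine-tuning base. The sketch below uses the Transformers `fill-mask` pipeline and assumes the repo id `adalbertojunior/distilbert-portuguese-cased` (inferred from the author and model name above); verify the exact id before use.

```python
from transformers import pipeline

# Assumed Hugging Face repo id, built from the author and model name on this card.
fill_mask = pipeline("fill-mask", model="adalbertojunior/distilbert-portuguese-cased")

# BERTimbau-derived models use the standard [MASK] token.
for pred in fill_mask("Lisboa é a capital de [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```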